AIbase

# Semantic Relation Classification

## RelBERT RoBERTa Large
RelBERT is a model based on RoBERTa-large, designed specifically for relation-embedding tasks. It is trained on the SemEval-2012 Task 2 dataset using Noise Contrastive Estimation (NCE).
Tags: Text Embedding, Transformers
Publisher: relbert
97 · 2
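The NCE objective mentioned above can be sketched as a binary contrastive loss: scores of true relation pairs are pushed up while scores of sampled noise pairs are pushed down. The following is a minimal illustration only; the scores and sampling here are hypothetical and not RelBERT's actual training pipeline.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def nce_loss(pos_scores, neg_scores):
    """Binary NCE objective: maximize sigmoid(score) for true pairs
    and sigmoid(-score) for sampled noise pairs."""
    pos_term = -sum(math.log(sigmoid(s)) for s in pos_scores)
    neg_term = -sum(math.log(sigmoid(-s)) for s in neg_scores)
    return pos_term + neg_term

# Toy scores, e.g. dot products between a relation embedding and candidate word pairs.
pos = [2.5, 1.8]         # true pairs should score high
neg = [-1.0, 0.3, -2.2]  # noise pairs should score low
loss = nce_loss(pos, neg)
```

The loss shrinks as true pairs score higher and noise pairs score lower, which is the behavior the contrastive training relies on.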
## MNLI 1
This model is based on BERT, a pre-trained language model built on the Transformer architecture and developed by Google. BERT performs strongly across a range of natural language processing tasks, including text classification, question answering, and named entity recognition.
Tags: Text Classification, Transformers
Publisher: kangnichaluo
14 · 0
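An MNLI-style classifier maps a (premise, hypothesis) pair to one of three labels: entailment, neutral, or contradiction. As a hedged sketch (not this model's actual code), the final step over the logits of a 3-way classification head looks like:

```python
import math

LABELS = ["entailment", "neutral", "contradiction"]  # assumed label order

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Turn 3-way classification-head logits into a label and probabilities."""
    probs = softmax(logits)
    return LABELS[probs.index(max(probs))], probs

label, probs = classify([3.1, 0.2, -1.4])  # hypothetical logits
```

Note that the index-to-label mapping varies between published MNLI checkpoints, so the order above is an assumption to check against the model's config.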
© 2025 AIbase